Patent abstract:
System for integrally measuring clinical parameters of visual function, comprising: a display unit (20) for representing a scene containing a 3D object whose characteristics vary, such as its virtual position and virtual volume within the scene; motion sensors (60) for detecting the position of the user's head and its distance from the display unit (20); tracking sensors (10) for detecting the position of the user's pupils and the pupillary distance; an interface (30) for user interaction with the scene; and processing means (42, 44) for analyzing the user's response by associating the data from the sensors (60, 10) and the interface (30) with the variations in the characteristics of the 3D object, and for estimating on that basis a plurality of clinical parameters of visual function relating to binocularity, accommodation, ocular motility and visual perception.
Publication number: BR112019009614A2
Application number: R112019009614
Filing date: 2017-10-27
Publication date: 2019-08-13
Inventor: García Ramos Eva
Applicant: E Health Technical Solutions S L
IPC main class:
Patent description:

“SYSTEM FOR INTEGRALLY MEASURING CLINICAL PARAMETERS OF VISUAL FUNCTION”
Technical Field of the Invention
[0001] The present invention belongs to the field of systems and methods for measuring clinical parameters of visual function. More specifically, it relates to techniques that use immersive virtual reality to measure this type of parameter.
State of the Art
[0002] Currently, measuring this type of clinical parameter of visual function requires a clinical specialist to conduct a session in which the patient is examined using various tests and optotypes. The personal, manual component of the measurement process commonly yields subjective results that are difficult to reproduce and merely qualitative.
[0003] Moreover, measurements are performed independently for each visual function to be tested. This sometimes makes the results invalid, since the influence of other factors is not considered. For example, it is known that patients often compensate for an anomaly in one visual function by relying more heavily on other visual functions.
[0004] In short, the patient's ability to adapt is currently not taken into account and, therefore, actions aimed at correcting a specific anomaly can result, in practice, in a global worsening of the patient's vision. In addition, measurements and tests on the patient are affected by the subjectivity of the specialist who performs them, which significantly limits the reproducibility and consistency of the experimental results obtained.
Brief Description of the Invention
[0005] The present invention relates to a system for integrally measuring ocular, oculomotor and visual function parameters, preferably in real time, and for generating therapy and training to improve visual function.
[0006] For this purpose, the system employs tracking sensors that detect the position of the user's pupils, and a three-dimensional (3D) display unit that reproduces scenes for the user containing 3D objects with predetermined properties of size, shape, color, speed, etc., selected according to the type of test to be performed as part of the session. Motion sensors detect the user's movements so that the display unit can adapt the scene and give it an immersive character.
[0007] The system also includes an interface with which the user can interact. In particular, the interface receives user commands for interacting with the display system. These commands can be captured in many different ways (via control buttons, voice commands, gloves, etc.).
[0008] The system also implements processing means that manage the display unit, the sensors and the interface in a coordinated manner. Thus, the user's response to visual stimuli generated in the display unit is detected by the sensors and transmitted to the processing means to measure clinical parameters.
[0009] An important point of the present disclosure lies in the virtual-reality technology used to generate environments for interaction with the user. In particular, what is sought is immersion in the virtual environment. This is particularly valuable for creating conditions for the user that resemble real ones, and that can therefore be reproduced again and again if desired. For this purpose, the display unit must be coupled to the motion sensors carried by the user. In some embodiments these may be virtual reality glasses; in others, a 3D screen and polarized glasses. In either case, the coupling of the motion sensors and the display unit allows the 3D image shown to adapt to the person's movement or position, making the user feel in motion throughout the displayed virtual environment, that is, immersed in it, preferably with a minimum visual field of 60° in order to properly perform the assessment, therapy and training of visual function. To achieve the above, precise coordination between the elements participating in a session is important. Thus, the user is first presented with a 3D Virtual Reality environment designed for immersion. In said 3D environment, 3D objects acting as "optotypes" are displayed; these are intended to serve as the stimulus on which the user should focus his or her vision and are related to the test to be performed.
Brief Description of the Figures
[0010] Figure 1 shows a simplified block diagram according to a possible embodiment of the invention.
[0011] FIGS. 2A, 2B show an example of a measurement of a healthy user looking at a scene with a moving 3D object.
[0012] FIGS. 3A, 3B show an example of a measurement of a user with a dysfunction looking at a scene with a moving 3D object.
[0013] Figure 4 shows a summary diagram with the general steps implemented in one embodiment.
Detailed Description of the Invention
[0014] An exemplary, non-limiting embodiment is explained in more detail with reference to the previous figures.
[0015] FIG. 1 illustrates a system for the integral measurement of clinical parameters of visual function in real time, comprising several components. A tracking sensor 10 periodically detects the position of the user's pupils; thus, not only changes in gaze direction but also their speed can be measured. In general, the tracking sensor 10 measures various parameters depending on the specific test being performed in the session. FIGS. 2 and 3 illustrate this aspect of the invention in greater detail. For example, the tracking sensor 10 can record the position of the right and left eye, the position of the object the user is looking at (with both eyes together and with each eye separately), the eye-sensor distance, the pupil size, the interpupillary distance, the speed of eye movement, etc. Generally, to perform these measurements, the tracking sensor 10 comprises a pair of cameras designed to focus on the user's eyes and capture their movements and position. This requires a sampling frequency high enough to capture rapid eye movements. The system must also calculate the position within the generated virtual environment at which the user is looking. The tracking sensor 10 is essential for correct optometric measurement, since a large number of dysfunctions are detected through anomalous eye movements in response to various stimuli. For the sake of clarity, FIGS. 2 and 3 show examples of how the measurements taken by the sensors 10, 60 are associated with a visual condition of the user, without or with a possible dysfunction, respectively.
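For illustration only, the readings enumerated above could be represented as in the following minimal Python sketch; every field name, unit and the speed derivation are assumptions of this description, not definitions taken from the patent.

```python
from dataclasses import dataclass

Vec3 = tuple[float, float, float]

@dataclass
class GazeSample:
    """One reading from the tracking sensor (10); all names are illustrative."""
    t: float                           # capture time, seconds
    right_eye_pos: Vec3                # position of the right eye
    left_eye_pos: Vec3                 # position of the left eye
    right_gaze_point: Vec3             # scene point looked at by the right eye
    left_gaze_point: Vec3              # scene point looked at by the left eye
    combined_gaze_point: Vec3          # binocular point of regard in the 3D scene
    eye_sensor_distance_cm: float
    pupil_diameter_mm: float
    interpupillary_distance_mm: float

def gaze_speed(a: GazeSample, b: GazeSample) -> float:
    """Approximate eye-movement speed between two consecutive samples (b.t > a.t)."""
    dx, dy, dz = (b.combined_gaze_point[i] - a.combined_gaze_point[i] for i in range(3))
    return (dx * dx + dy * dy + dz * dz) ** 0.5 / (b.t - a.t)
```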
[0016] A display unit 20 with immersive 3D features reproduces or projects scenes with depth for the user, including 3D objects with predetermined properties of size, shape, color, location in the scene, distance from the user, and whether static or moving, etc. These scenes, including their 3D objects, act as optotypes and can be selected in the system according to the type of test to be performed, making it possible to generate specific visual stimuli in the user. Thus, a plurality of scenes can be designed to present the user with different visual challenges and stimuli, whether for assessment, therapy or visual function training.
[0017] The system also includes an interface 30 for user interaction. In particular, the interface receives user commands to control the display unit 20 and other elements of the system. The interface 30, in turn, can also transmit test instructions to the user. Thus, the system can measure the user's responses through his or her actions (movement, position in the 3D environment, button presses, etc.).
[0018] The system also includes processing means 40, preferably implemented as a server 42 and a terminal 44, which share in a coordinated manner the management of the display unit 20, the sensor 10 and the interface 30, so that the user's visual responses can be detected by the sensor 10 and transmitted to the server 42 for the measurement of clinical parameters of visual function. In addition, the display unit 20 allows the represented 3D image to adapt to the user's movement. The display unit 20 may include a dissociation system (such as polarized glasses or the like).
[0019] The test is preferably initiated through the 3D interface. While a given scene is being viewed, the user's visual responses detected by the sensor 10 at a given moment are associated with the 3D objects represented at that moment in the display unit 20. The changes in the position of the user's pupils are detected and combined with the movements of the user's head, which are detected by means of a motion sensor 60. The coupling of the motion sensors 60 and the display unit 20 allows the 3D image shown to adapt to the person's movement or position, so that the user truly feels in motion through the virtual environment in which he or she is immersed.
[0020] The data are processed and the properties of the 3D objects are associated with the responses to the generated stimuli detected by the sensors 10, 60. This makes it possible to measure clinical parameters of visual function under reproducible and controllable conditions. Thus, through adequate processing of the data obtained, the user's visual behavior, eye movement, convergence, etc., can be determined. In addition, the clinical parameters of visual function can be compared with an expected range in order to assess whether any problems exist.
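A minimal sketch of this stimulus-response association, assuming time-stamped records on both sides; pairing by timestamp is an assumption of this description, since the patent does not prescribe a concrete method:

```python
from bisect import bisect_right

def align_stimulus_response(object_states, gaze_samples):
    """Pair each gaze sample with the 3D-object state shown at that instant.

    object_states: list of (t, state) tuples sorted by time, where `state`
    holds the object's virtual position, volume, etc.; gaze_samples: list of
    GazeSample records from the sketch above.
    """
    times = [t for t, _ in object_states]
    pairs = []
    for s in gaze_samples:
        i = max(bisect_right(times, s.t) - 1, 0)  # last state shown at or before s.t
        pairs.append((object_states[i][1], s))
    return pairs
```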
[0021] As indicated, together with the visualization of the 3D objects in the display unit 20, the tracking sensor 10 tracks the user's gaze in said Virtual Reality environment. The tracking sensor 10 records:
- The position of the eyes (left and right).
- The location at which each eye looks (separately).
- The location at which the user looks using both eyes in combination in the 3D environment (a sketch of how this combined point of regard can be derived is given after the list).
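The patent does not specify how the combined binocular location is computed; one classical way to derive it from the two monocular gaze rays, given here purely as a hedged sketch, is to take the midpoint of the closest-approach segment between the rays:

```python
import numpy as np

def binocular_point_of_regard(eye_l, dir_l, eye_r, dir_r):
    """Midpoint of the closest-approach segment between the two gaze rays.
    eye_l/eye_r: eye positions; dir_l/dir_r: gaze directions (3-vectors).
    """
    d1 = np.asarray(dir_l, float) / np.linalg.norm(dir_l)
    d2 = np.asarray(dir_r, float) / np.linalg.norm(dir_r)
    w0 = np.asarray(eye_l, float) - np.asarray(eye_r, float)
    b = d1 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = 1.0 - b * b                  # a = c = 1 after normalization
    if abs(denom) < 1e-9:                # (near-)parallel rays: no stable answer
        return None
    s = (b * e - d) / denom              # parameter along the left-eye ray
    t = (e - b * d) / denom              # parameter along the right-eye ray
    return (np.asarray(eye_l) + s * d1 + np.asarray(eye_r) + t * d2) / 2.0
```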
[0022] At the same time, instructions can be shown to guide users, explaining what they must do at each moment. These instructions can be given as text or audio through the interface 30. Said interface 30 also allows the user to interact with the 3D objects of the scene represented in the display unit 20. The interaction with the user starts at that moment, and the responses given to the stimuli shown, which constitute the measurements, must be recorded.
[0023] These user responses can be given, for example, through:
- Movement of the device (in any direction in space).
- Position of the device within the Virtual Reality environment.
- Pressing the device buttons.
- Voice commands.
[0024] In the situation described above, the preceding tasks are preferably performed on a client terminal 44, although they are provided (downloaded) from an external server 42. A distributed environment makes it possible to reduce the technical requirements of the terminal 44 and allows centralized control of tests performed on different users, access to statistical data, etc. For example, the heaviest operations and calculations can be performed on the server 42, which offloads the processing workload from the terminal 44. Likewise, the characteristics of a test can be defined from the server 42 (a sketch of such a test definition follows the list):
- The Virtual Reality environment to be used.
- 3D objects and their characteristics (size, distance, colors, movement).
- What instructions to give the user.
- When to capture information with the tracking sensor 10.
- When to capture information with the user interface 30.
- What data should be recorded and generated as a result of executing the task.
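Purely as an illustration of the bullet points above, a server-side test definition might look like the following; the schema and every key are hypothetical, not the patent's format:

```python
# Hypothetical test definition downloaded from the server (42) to the
# client terminal (44); all keys and values are illustrative.
test_definition = {
    "environment": "train_scene",              # Virtual Reality environment to use
    "objects": [{
        "model": "train",
        "size": 1.0,                           # relative scale
        "color": "#d03030",
        "path": [[0.0, 1.2, 8.0], [2.5, 1.2, 3.0]],  # virtual positions over time
        "speed_m_s": 0.4,
    }],
    "instructions": {"text": "Follow the train.", "audio": "follow_train.ogg"},
    "capture": {"tracking_sensor_hz": 120, "interface_events": True},
    "record": ["gaze_trace", "head_trace", "interface_events"],
}
```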
[0025] As for the data to be recorded, these include data from the sensors 10, 60 and also data from the user's interaction through the interface 30.
[0026] Once all local data processing is finished, the data are grouped and sent to the server 42 for storage and subsequent analysis. From them, statistics, new tests, recommendations, therapies, etc., can be produced.
[0027] For example, it can be checked whether the values obtained for certain parameters are within the tolerance limits established by scientific studies stored on the server 42. Alternatively, a new scene can be designed and recommended to act as therapy or training to improve those functions for which the test yielded a poorer result.
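A hedged sketch of such a tolerance check, with parameter names and ranges invented for the example:

```python
def check_parameters(measured: dict, reference: dict) -> dict:
    """Flag each estimated clinical parameter as inside or outside the stored
    reference range; names and ranges below are illustrative only.

    Example:
        measured  = {"near_point_of_convergence_cm": 11.0}
        reference = {"near_point_of_convergence_cm": (0.0, 10.0)}
    """
    report = {}
    for name, value in measured.items():
        low, high = reference[name]
        report[name] = "ok" if low <= value <= high else "out_of_range"
    return report
```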
[0028] FIGS. 2A and 2B illustrate an example in which a user interacts with the system at two moments in time. At an initial instant t = ti, the system represents in the display unit 20 a 3D model corresponding to a train running along a railway line.
[0029] The user wears a motion sensor 60 that records, at both moments in time, the movements of the head (xic, yic), (xfc, yfc) and the distance to the display unit 20, Dic and Dfc. Similarly, a tracking sensor 10 records the user's pupillary movements at both times, providing the position of both pupils. Right: (xi1, yi1, zi1), (xf1, yf1, zf1); Left: (xi2, yi2, zi2), (xf2, yf2, zf2).
[0030] Meanwhile, the display unit 20 represents the 3D object at two different virtual positions (xio, yio, zio), (xfo, yfo, zfo) and with a different volume at each instant of time, Vio, Vfo. Other properties, such as the color of the 3D object, may vary depending on the test to be performed in the session.
[0031] When the above values are processed, it is verified that the user's eyes are properly coordinated with the movement of the 3D object in the scene. The visual behavior corresponds to a healthy individual.
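One plausible way to verify this coordination, sketched here under the assumption of the aligned (state, sample) pairs built earlier, is to check that the binocular point of regard stays close to the object's virtual position; the tolerance is an invented value, not a clinical criterion from the patent:

```python
import numpy as np

def pursuit_coordination(pairs, tolerance=0.05):
    """Mean distance between the binocular point of regard and the object's
    virtual position over a scene; `tolerance` is in scene units."""
    errors = [float(np.linalg.norm(np.asarray(state["position"]) -
                                   np.asarray(sample.combined_gaze_point)))
              for state, sample in pairs]
    mean_error = sum(errors) / len(errors)
    return mean_error, mean_error <= tolerance
```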
[0032] FIGS. 3A and 3B schematically illustrate the opposite case, in which the user's visual behavior does not respond adequately to the stimuli. As can be seen in FIG. 3A, the user does not correctly align the visual axis of his left eye (xi2, yi2, zi2) with the object of interest (Vio), revealing a limitation of his binocular vision (strabismus). In this case, the angle of deviation (FIG. 3B) remains constant as the object of interest moves (Vfo), indicating a comitant condition, that is, one with the same angle of deviation in different positions of the eye. This information is essential to determine the severity of the condition, as well as the type of visual therapy to recommend in order to restore the individual's binocularity.
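The deviation angle and its constancy across object positions can be sketched as follows; the 2° spread used to call the deviation constant is an assumption for illustration:

```python
import numpy as np

def deviation_angle_deg(eye_pos, gaze_point, object_pos):
    """Angle between where the eye actually looks and where the object is."""
    u = np.asarray(gaze_point, float) - np.asarray(eye_pos, float)
    v = np.asarray(object_pos, float) - np.asarray(eye_pos, float)
    cos = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def is_comitant(angles_deg, max_spread_deg=2.0):
    """A deviation that stays (nearly) constant while the object moves across
    different eye positions suggests a comitant condition."""
    return max(angles_deg) - min(angles_deg) <= max_spread_deg
```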
[0033] Clearly, the chosen scene is just one example. Others could be an aquarium with fish of different shapes, colors and sizes that appear and disappear; a road with a car approaching the user; holes with moles popping out at random, etc. In these scenes, the parameters can be measured objectively and all together (without neglecting the influence they exert on one another).
[0034] FIG. 4 briefly illustrates a possible sequence of actions during the operation of the system for a test. In a first step 50, relevant personal information about the user is recorded. Preferably, the data entered are: sex, age, habits, etc., for which the interface 30 can be used. The user, via the terminal 44, places a request with the server 42 as a client, and the application associated with the selected test type is installed.
[0035] The user stands in front of the display unit 20, and instructions are given via the interface 30 or the display unit 20 so as to position the tracking sensor 10 correctly or to seat the user in the appropriate position relative to the display unit according to the motion sensor 60. Then, in step 51, a scene related to the selected test is represented, with one or more 3D objects whose properties change over time or with user interaction.
[0036] The user and the display unit interact via the interface 30 in step 52. In general, the user can receive instructions during the test, using graphics and audio. For example, the interface 30 can incorporate any element that facilitates user interaction with the 3D objects of the scene represented in the display unit 20.
[0037] The sensors 10, 60 capture the values in step 53 while the scene is playing. These data must be sent with minimal latency to the terminal 44.
[0038] The terminal 44 receives the captured data, pre-processes them and sends them to the server 42 in order to obtain values of clinical parameters of visual function in step 54.
[0039] Once the test, or the session comprising different tests, is finished, the terminal 44 sends the data obtained to the server 42 for storage and further processing. In particular, the parameters are compared with an expected range for the user profile in step 55.
[0040] Once the server 42 has processed the data obtained, it relates them to a possible dysfunction in step 56.
[0041] Finally, the server generates possible recommendations for improving the dysfunction in step 57 and transmits them to the terminal 44 so that they can be shown to the user, along with the results obtained.
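The whole FIG. 4 flow can be summarized by the following sketch; every object and method here is a hypothetical stand-in for the patent's components (server 42, terminal 44, sensors 10/60, interface 30, display unit 20), not an actual API:

```python
def run_session(server, terminal, sensors, interface, display, user_profile):
    """Hedged end-to-end sketch of the FIG. 4 sequence (steps 50-57)."""
    test = server.download_test(user_profile)                 # step 50
    display.load_scene(test["environment"], test["objects"])  # step 51
    interface.show_instructions(test["instructions"])         # step 52
    samples = sensors.capture_while(display.is_playing)       # step 53
    features = terminal.preprocess(samples)                   # step 54
    parameters = server.estimate_parameters(features)         # step 54
    report = server.compare_with_reference(parameters, user_profile)  # steps 55-56
    terminal.show(server.recommendations(report))             # step 57
```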
[0042] Thanks to the technology used, the tests are performed objectively, completely and in a personalized way, and make it possible to identify different visual dysfunctions: especially those that limit the ability to align the eyes on the object of interest (insufficient convergence, excessive divergence, vergence inflexibility), the ability to focus (insufficient accommodation, excess accommodation, accommodative inflexibility), the ability to shift the gaze from one object to another (saccadic eye movements) or to track an object (smooth pursuit movements), or the visual-perceptual skills needed to identify and manage information about our environment. All of them can be evaluated and trained in a personalized way (based not only on the conditions under which the test was performed, but also on the progress of each work session). Moreover, a wide variety of visualizations can be provided for the same exercise, which allows better adaptation to daily visual demands and helps maintain interest and attention in performing the exercises.
[0043] It should be noted that the present invention is useful not only for identifying dysfunctions, but also for training the physical and technical abilities of healthy users through visual stimuli and challenges. This applies directly to sportspeople and children, and can be extended both to specific professionals (drivers, ...) and to amateur activities with particular visual demands (skill in handling miniatures, entertainment games, ...).
[0044] It should be noted that one of the advantages of the present disclosure is that only a small space is needed to accommodate the necessary devices. For example, in an embodiment using a screen as the display unit 20, everything can be placed on a table at a distance of between 50 cm and 1 m from the user, who will preferably be seated, together with a computer forming part of the processing means 40. The remaining elements are carried by the user on the head and hands (control devices, gloves, ...). Even fewer elements are needed in an embodiment in which the display unit 20 is a pair of virtual reality glasses.
Claims:
Claims (13)
[1]
1. System for integrally measuring clinical parameters of visual function, characterized in that it comprises:
- a display unit (20) configured to represent a scene in which at least one 3D object has variable characteristics to promote a visual response in the user, wherein said variable characteristics include at least the virtual position (Xo, Yo, Zo) and the virtual volume (Vo) of the 3D object within the scene;
- a plurality of motion sensors (60) configured to detect the position of the user's head (Xc, Yc) and the distance (Dc) from the display unit (20);
- a plurality of tracking sensors (10) configured to detect the position of the user's pupils (xp, yp, zp) and the pupil diameter (dp);
- an interface (30) configured to allow the user to interact in the scene;
- processing means (42, 44) configured to analyze the user's response by:
associating the data from the sensors (60, 10) and the interface (30) with the variation of the characteristics of the 3D object represented in the display unit; and
estimating a plurality of clinical parameters of the user's visual function.
[2]
2. System according to claim 1, characterized in that the characteristics are variable as a function of time according to a predetermined schedule.
[3]
3. System according to claim 1 or 2, characterized in that the variable characteristics also include the color of the 3D object.
[4]
4. System according to any one of claims 1 - 3, characterized in that the characteristics are variable depending on the interaction of the user through the interface (30).
[5]
5. System according to claim 4, characterized in that the interface (30) comprises at least one of the following: a digital pen, a glove, a control device or the like.
[6]
6. System according to any one of claims 1-5, characterized in that the display unit (20) comprises a 3D screen.
[7]
7. System according to any one of claims 1-6, characterized in that the display unit (20) comprises Virtual Reality glasses.
[8]
8. System according to any one of claims 1-7, characterized in that the display unit (20) comprises a dissociation system.
[9]
9. System according to any one of claims 1-8, characterized in that the processing means (42, 44) are further configured to compare the estimated clinical parameters of visual function with a stored range of reference values and to establish a possible visual dysfunction based on the comparison.
[10]
10. System according to claim 8 or 9, characterized in that the comparison with a range of reference values is performed based on a user profile that includes at least age information.
[11]
11. System according to any one of claims 1-10, characterized in that the processing means comprise a client terminal (44) and a server (42), wherein the client terminal (44) is configured to receive and process the data measured by the sensors (10, 60) and send them to the server (42).
[12]
12. System according to claim 11, characterized in that the server (42) is configured to compare the values with a database of reference values.
[13]
13. System according to any one of claims 1-12, characterized in that the clinical parameters of visual function refer to at least one of the following: binocularity, accommodation, ocular motility or visual perception.
Similar technologies:
Publication number | Publication date | Patent title
BR112019009614A2|2019-08-13|system to fully measure clinical parameters of visual function
Gibaldi et al.2017|Evaluation of the Tobii EyeX Eye tracking controller and Matlab toolkit for research
Wass et al.2014|Robustness and precision: How data quality may influence key dependent variables in infant eye‐tracker analyses
Laurens et al.2013|Computation of linear acceleration through an internal model in the macaque cerebellum
Bertolini et al.2011|Velocity storage contribution to vestibular self-motion perception in healthy human subjects
US20170169716A1|2017-06-15|System and method for assessing visual and neuro-cognitive processing
US10643741B2|2020-05-05|Systems and methods for a web platform hosting multiple assessments of human visual performance
EP3307135B1|2021-09-22|Methods and systems for testing aspects of vision
Serrano-Pedraza et al.2011|Visual suppression in intermittent exotropia during binocular alignment
Denniss et al.2014|Structure–function mapping: variability and conviction in tracing retinal nerve fiber bundles and comparison to a computational model
US9504380B1|2016-11-29|System and method for assessing human visual processing
CN112053781A|2020-12-08|Dynamic and static stereoscopic vision testing method and terminal
JP5718494B1|2015-05-13|Impression estimation device, method thereof, and program
Vingerhoets et al.2007|Verticality perception during off-vertical axis rotation
JP5718495B1|2015-05-13|Impression estimation device, method thereof, and program
Champion et al.2010|Discrimination contours for the perception of head-centered velocity
JP2017086529A|2017-05-25|Impression estimation device and program
Yoshikawa et al.2015|Effects of two-minute stereoscopic viewing on human balance function
KR20200069954A|2020-06-17|Method for analyzing element of motion sickness for virtual reality content and apparatus using the same
Köles et al.2014|Experiences of a combined psychophysiology and eye-tracking study in VR
US20210045628A1|2021-02-18|Methods, systems, and computer readable media for testing visual function using virtual mobility tests
JP6445418B2|2018-12-26|Impression estimation device, impression estimation method, and program
Sasaoka et al.2019|Ease of hand rotation during active exploration of views of a 3-D object modulates view generalization
McCaslin et al.2020|Stereotest comparison: efficacy, reliability, and variability of a new glasses-free stereotest
Shibata et al.2018|Role of the right dorsolateral prefrontal cortex in the cognitive process of matching between action and visual feedback
Family patents:
Publication number | Publication date
RU2754195C2|2021-08-30|
RU2019116179A|2020-12-10|
WO2018087408A1|2018-05-17|
CN110167421A|2019-08-23|
AU2017359293A1|2019-06-06|
JP2019535401A|2019-12-12|
KR20190104137A|2019-09-06|
EP3320829A1|2018-05-16|
CA3043276A1|2018-05-17|
IL266461D0|2019-06-30|
CN110167421B|2022-03-04|
RU2019116179A3|2021-02-09|
US20190254519A1|2019-08-22|
EP3320829A8|2018-07-18|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title

CN100477734C|2001-07-06|2009-04-08|帕朗特研究公司|Imaging system and method employing reciprocal space optical design|
EP1691670B1|2003-11-14|2014-07-16|Queen's University At Kingston|Method and apparatus for calibration-free eye tracking|
UY28083A1|2003-11-14|2003-12-31|Nicolas Fernandez Tourn Suarez|VESTIBULAR REHABILITATION UNIT|
US20060005846A1|2004-07-07|2006-01-12|Krueger Wesley W|Method for balance enhancement through vestibular, visual, proprioceptive, and cognitive stimulation|
JP4890060B2|2005-03-31|2012-03-07|株式会社トプコン|Ophthalmic equipment|
CN105212890B|2006-01-26|2017-04-26|诺基亚公司|Eye tracker equipment|
JP2008012223A|2006-07-10|2008-01-24|Nippon Telegr & Teleph Corp <Ntt>|Device for adding function for game table having sight line analyzing and tracking function, and method for adding function for game table|
US9149222B1|2008-08-29|2015-10-06|Engineering Acoustics, Inc|Enhanced system and method for assessment of disequilibrium, balance and motion disorders|
TWI357320B|2007-09-17|2012-02-01|
US9788714B2|2014-07-08|2017-10-17|Iarmourholdings, Inc.|Systems and methods using virtual reality or augmented reality environments for the measurement and/or improvement of human vestibulo-ocular performance|
CN101727531A|2008-10-16|2010-06-09|国际商业机器公司|Method and system used for interaction in virtual environment|
WO2012125172A1|2011-03-15|2012-09-20|Kona Medical, Inc.|Energetic modulation of nerves|
US9345957B2|2011-09-30|2016-05-24|Microsoft Technology Licensing, Llc|Enhancing a sport using an augmented reality display|
FR2987920B1|2012-03-08|2018-03-02|Essilor International|METHOD FOR DETERMINING A GEOMETRIC-MORPHOLOGICAL, POSTURE OR BEHAVIORAL CHARACTERISTIC OF A BEARER OF A PAIR OF EYEWEAR|
AU2012393196A1|2012-10-22|2015-10-01|Realvision S.R.L.|Network of devices for performing optical/optometric/ophthalmological tests, and method for controlling said network of devices|
JP6726965B2|2012-11-26|2020-07-22|コナン・メディカル・ユーエスエイ・インコーポレイテッドKonan Medical USA, Inc.|Class turbo ray method and apparatus|
US20160235323A1|2013-09-25|2016-08-18|Mindmaze Sa|Physiological parameter measurement and feedback system|
CN104382552B|2014-11-27|2016-04-27|毕宏生|A kind of comprehensive visual function detection equipment|
CN104545787B|2014-12-12|2017-02-01|许昌红|Wearable pupil light reflex measurement equipment|
JP6663441B2|2015-03-01|2020-03-11|ノバサイト リミテッド|System for measuring eye movement|
CN107645921B|2015-03-16|2021-06-22|奇跃公司|Methods and systems for diagnosing and treating health disorders|
CN105832502B|2016-03-15|2018-01-02|广东卫明眼视光研究院|Intelligent vision functional training and instrument for training|
EP3062142B1|2015-02-26|2018-10-03|Nokia Technologies OY|Apparatus for a near-eye display|
CN106569339B|2016-11-08|2019-11-15|歌尔科技有限公司|The control method of VR helmet and VR helmet|
US10650552B2|2016-12-29|2020-05-12|Magic Leap, Inc.|Systems and methods for augmented reality|
EP3343267A1|2016-12-30|2018-07-04|Nokia Technologies Oy|Polychromatic light out-coupling apparatus, near-eye displays comprising the same, and method of out-coupling polychromatic light|
KR20200100720A|2017-12-20|2020-08-26|매직 립, 인코포레이티드|Insert for augmented reality viewing device|
CN112136152A|2018-03-15|2020-12-25|奇跃公司|Image correction caused by deformation of components of a viewing device|
EP3803488A4|2018-05-30|2021-07-28|Magic Leap, Inc.|Compact variable focus configurations|
EP3804306A4|2018-06-05|2022-03-02|Magic Leap Inc|Homography transformation matrices based temperature calibration of a viewing system|
EP3803545A4|2018-06-08|2022-01-26|Magic Leap Inc|Augmented reality viewer with automated surface selection placement and content orientation placement|
JP2021533465A|2018-08-02|2021-12-02|マジック リープ, インコーポレイテッドMagic Leap, Inc.|Visualization system with interpupillary distance compensation based on head movement|
US10795458B2|2018-08-03|2020-10-06|Magic Leap, Inc.|Unfused pose-based drift correction of a fused pose of a totem in a user interaction system|
JP2022509770A|2018-11-16|2022-01-24|マジック リープ, インコーポレイテッド|Clarification triggered by image size to maintain image sharpness|
KR102184972B1|2019-02-01|2020-12-01|주식회사 룩시드랩스|Apparatus and method for measuring distance between pupil centers|
WO2020240052A1|2019-05-29|2020-12-03|E-Health Technical Solutions, S.L.|System for measuring clinical parameters of the visual function|
CN112866679B|2021-04-23|2021-08-10|广东视明科技发展有限公司|Multi-point stereoscopic vision detection method in motion state|
Legal status:
2020-01-14| B25G| Requested change of headquarter approved|Owner name: E-HEALTH TECHNICAL SOLUTIONS, S.L. (ES) |
2021-10-05| B350| Update of information on the portal [chapter 15.35 patent gazette]|
Priority:
申请号 | 申请日 | 专利标题
EP16382521.9A|EP3320829A1|2016-11-10|2016-11-10|System for integrally measuring clinical parameters of visual function|
PCT/ES2017/070721|WO2018087408A1|2016-11-10|2017-10-27|System for integrally measuring clinical parameters of visual function|